Constant Time Graph Neural Networks

Authors

Abstract

The recent advancements in graph neural networks (GNNs) have led to state-of-the-art performances in various applications, including chemo-informatics, question-answering systems, and recommender systems. However, scaling up these methods to huge graphs, such as social networks and Web graphs, remains a challenge. In particular, the existing methods for accelerating GNNs either are not theoretically guaranteed in terms of the approximation error or incur at least linear-time computation cost. In this study, we reveal the query complexity of the uniform node sampling scheme for Message Passing Neural Networks, including GraphSAGE, graph attention networks (GATs), and graph convolutional networks (GCNs). Surprisingly, our analysis reveals that the complexity of this method is completely independent of the number of nodes, edges, and neighbors of the input and depends only on the error tolerance and confidence probability, while providing a theoretical guarantee on the approximation error. To the best of our knowledge, this is the first article to provide such a guarantee within constant time. Through experiments with synthetic and real-world datasets, we investigated the speed and precision of the method and validated the theoretical results.
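As a rough, hedged illustration of the sampling scheme described above (a minimal sketch, not the authors' implementation; the names sample_size and approx_neighbor_mean and the Hoeffding-style bound are assumptions for illustration), the Python snippet below averages a fixed-size uniform sample of a node's neighbor features. The sample size depends only on the error tolerance and the confidence level, never on the number of nodes, edges, or neighbors.

# Hedged sketch of the uniform neighbor-sampling idea; names are illustrative.
import math
import random
import numpy as np

def sample_size(eps: float, delta: float) -> int:
    """Sampled-neighbor count from a Hoeffding-style bound: it depends only on
    the error tolerance eps and the confidence delta, not on the graph size."""
    return math.ceil(math.log(2.0 / delta) / (2.0 * eps ** 2))

def approx_neighbor_mean(features: np.ndarray, neighbors: list[int],
                         eps: float = 0.1, delta: float = 0.05) -> np.ndarray:
    """Estimate the mean of the neighbors' feature vectors from a uniform
    sample with replacement; the cost is O(sample_size * feature_dim)."""
    s = sample_size(eps, delta)
    idx = [random.choice(neighbors) for _ in range(s)]
    return features[idx].mean(axis=0)

# Toy usage: one approximate mean-aggregation step for a high-degree node.
rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 16))        # node features
neighbors_of_v = list(range(1, 5_000))   # node v has thousands of neighbors
h_v = approx_neighbor_mean(X, neighbors_of_v)

Mean aggregation is used here only because it makes the sampling estimate easy to state; the paper covers message passing architectures such as GCNs, GATs, and GraphSAGE, whose exact complexity statements may differ.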


Similar articles

Graph Transformation in Constant Time

We present conditions under which graph transformation rules can be applied in time independent of the size of the input graph: graphs must contain a unique root label, nodes in the left-hand sides of rules must be reachable from the root, and nodes must have a bounded outdegree. We establish a constant upper bound for the time needed to construct all graphs resulting from an application of a f...
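A minimal sketch, assuming an adjacency-list representation, of why these conditions bound the work independently of the input size (an illustration, not the paper's rule-application algorithm): with a unique root and outdegree at most b, every node within distance d of the root can be enumerated by visiting at most 1 + b + ... + b^d nodes, so matching a left-hand side that fits inside that radius never depends on |G|.

# Illustrative sketch: enumerate the constant-size neighborhood of the root.
from collections import deque

def nodes_within_depth(adj: dict[int, list[int]], root: int, depth: int) -> set[int]:
    """Collect all nodes reachable from the root in at most `depth` steps."""
    seen = {root}
    frontier = deque([(root, 0)])
    while frontier:
        v, dist = frontier.popleft()
        if dist == depth:
            continue
        for w in adj.get(v, []):
            if w not in seen:
                seen.add(w)
                frontier.append((w, dist + 1))
    return seen

# With outdegree <= b, the result has at most 1 + b + ... + b**depth nodes.
adj = {0: [1, 2], 1: [3], 2: [4]}
print(nodes_within_depth(adj, root=0, depth=2))   # {0, 1, 2, 3, 4}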


Constant time algorithms in sparse graph model

We focus on constant-time algorithms for graph problems in the bounded-degree model. We introduce several techniques to design constant-time approximation algorithms for problems such as Vertex Cover, Maximum Matching, Maximum Weighted Matching, Maximum Independent Set, and Set Cover. Some of our techniques can also be applied to design constant-time testers for minor-closed properties. In Chapter 1...
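As a hedged illustration of the local-simulation style such algorithms typically use (a toy sketch in Python; the oracle, the cycle example, and the estimator are assumptions for illustration, not taken from this work): assign random ranks to edges, decide whether a queried edge belongs to the random-order greedy maximal matching by recursing only into lower-ranked adjacent edges, and estimate the matching size from a small uniform sample of edges.

import random

def make_matching_oracle(edge_neighbors, seed=0):
    """edge_neighbors(e) returns the edges sharing an endpoint with e.
    The returned oracle answers whether e lies in the random-order greedy
    maximal matching, using only local exploration of lower-ranked edges."""
    rng = random.Random(seed)
    rank, memo = {}, {}

    def r(e):
        if e not in rank:
            rank[e] = rng.random()
        return rank[e]

    def in_matching(e):
        if e in memo:
            return memo[e]
        # e joins the greedy matching iff no lower-ranked adjacent edge did.
        memo[e] = all(not in_matching(f) for f in edge_neighbors(e) if r(f) < r(e))
        return memo[e]

    return in_matching

# Toy usage: estimate the greedy matching size of a 1000-edge cycle.
n = 1000
edges = [(i, (i + 1) % n) for i in range(n)]

def cycle_edge_neighbors(e):
    i, j = e
    return [((i - 1) % n, i), (j, (j + 1) % n)]

oracle = make_matching_oracle(cycle_edge_neighbors)
sample = random.sample(edges, 50)
frac = sum(oracle(e) for e in sample) / len(sample)
print(f"estimated greedy matching size: {frac * n:.0f} of {n} edges")

For bounded-degree graphs the expected number of recursive calls per query is constant, which is what keeps such estimators within constant time overall.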


Finite-Time Passivity of Discrete-Time T-S Fuzzy Neural Networks with Time-Varying Delays

This paper focuses on the problem of finite-time boundedness and finite-time passivity of discrete-time T-S fuzzy neural networks with time-varying delays. A suitable Lyapunov-Krasovskii functional (LKF) is established to derive a sufficient condition for finite-time passivity of discrete-time T-S fuzzy neural networks. The dynamical system is transformed into a T-S fuzzy model with uncertain par...
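For orientation, one commonly used discrete-time formulation of these notions is written out below in LaTeX; it is a generic textbook-style statement under assumed scalars c_1 < c_2, weight matrix R > 0, horizon N, and dissipation level gamma, and the paper's exact LMI-based conditions for the fuzzy model may differ.

% Finite-time boundedness with respect to (c_1, c_2, R, N), 0 < c_1 < c_2, R \succ 0:
x^{\top}(0)\, R\, x(0) \le c_1 \;\Longrightarrow\; x^{\top}(k)\, R\, x(k) < c_2,
\qquad k = 1, \dots, N.

% One common passivity requirement under zero initial conditions:
2 \sum_{k=0}^{N} y^{\top}(k)\, u(k) \;\ge\; -\gamma \sum_{k=0}^{N} u^{\top}(k)\, u(k),
\qquad \gamma \ge 0.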


Kernel Graph Convolutional Neural Networks

Graph kernels have been successfully applied to many graph classification problems. Typically, a kernel is first designed, and then an SVM classifier is trained based on the features defined implicitly by this kernel. This two-stage approach decouples data representation from learning, which is suboptimal. On the other hand, Convolutional Neural Networks (CNNs) have the capability to learn thei...
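A minimal sketch of the two-stage pipeline this abstract contrasts with CNN-style feature learning, assuming a simple vertex-histogram kernel, toy data, and scikit-learn's precomputed-kernel SVM (none of which are the paper's actual setup):

# Stage 1: a fixed graph kernel; Stage 2: an SVM trained on that kernel.
import numpy as np
from sklearn.svm import SVC

def vertex_histogram(node_labels, num_labels):
    """Feature map of the vertex-histogram kernel: count node labels."""
    h = np.zeros(num_labels)
    for lab in node_labels:
        h[lab] += 1
    return h

# Toy dataset: each "graph" is reduced to its list of node labels (0 or 1).
graphs = [[0, 0, 1], [1, 1, 1, 0], [0, 0, 0], [1, 1, 0, 1, 1]]
y = np.array([0, 1, 0, 1])

Phi = np.stack([vertex_histogram(g, num_labels=2) for g in graphs])
K = Phi @ Phi.T                            # kernel matrix from the feature map

clf = SVC(kernel="precomputed").fit(K, y)  # learning decoupled from the kernel
print(clf.predict(Phi @ Phi.T))            # predictions on the training graphs

The point of the sketch is the decoupling the abstract criticizes: the representation (the kernel) is fixed before the classifier ever sees the labels.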


Gated Graph Sequence Neural Networks

Graph-structured data appears frequently in domains including chemistry, natural language semantics, social networks, and knowledge bases. In this work, we study feature learning techniques for graph-structured inputs. Our starting point is previous work on Graph Neural Networks (Scarselli et al., 2009), which we modify to use gated recurrent units and modern optimization techniques and then ex...
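A minimal sketch, assuming PyTorch and a single edge type, of the kind of gated propagation step this line of work uses: sum messages from neighbors, then update every node state with a GRU cell. The class and variable names are illustrative, not the authors' code.

# One gated propagation step: message passing followed by a GRU update.
import torch
import torch.nn as nn

class GatedPropagation(nn.Module):
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.msg = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.gru = nn.GRUCell(hidden_dim, hidden_dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h: (num_nodes, hidden_dim), adj: (num_nodes, num_nodes) adjacency.
        m = adj @ self.msg(h)     # messages summed over each node's neighbors
        return self.gru(m, h)     # gated (GRU) update of every node state

# Toy usage: three nodes on a path graph, two propagation steps.
adj = torch.tensor([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
h = torch.randn(3, 8)
layer = GatedPropagation(hidden_dim=8)
for _ in range(2):
    h = layer(h, adj)
print(h.shape)   # torch.Size([3, 8])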



Journal

Journal title: ACM Transactions on Knowledge Discovery From Data

Year: 2022

ISSN: 1556-472X, 1556-4681

DOI: https://doi.org/10.1145/3502733